Low-rank Tensor Approximation
Authors
Abstract
Approximating a tensor by another of lower rank is in general an ill-posed problem. Yet this kind of approximation is mandatory in the presence of measurement errors or noise. We show how tools recently developed in compressed sensing can be used to solve this problem. More precisely, a minimal angle between the columns of the loading matrices restores both existence and uniqueness of the best low-rank approximation. We then show how these results can be applied to jointly perform localization and extraction of multiple sources from a noisy mixture recorded on multiple sensors, in an entirely deterministic manner. The main interest of deterministic approaches is that they remain applicable in the presence of strong channel nonstationarities.
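The ill-posedness mentioned above can be seen concretely in the classical example of de Silva and Lim: a rank-3 tensor that is the limit of rank-2 tensors, so that no best rank-2 approximation exists. A minimal numerical sketch (using NumPy, with the standard choice of basis vectors; not taken from this paper) is:

```python
import numpy as np

def outer3(x, y, z):
    # Rank-1 third-order tensor x ⊗ y ⊗ z.
    return np.einsum('i,j,k->ijk', x, y, z)

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])

# T = a⊗a⊗b + a⊗b⊗a + b⊗a⊗a has tensor rank 3.
T = outer3(a, a, b) + outer3(a, b, a) + outer3(b, a, a)

# T_eps is a difference of two rank-1 terms, hence has rank at most 2,
# yet it converges to T as eps -> 0: the infimum over rank-2 tensors
# is not attained, so the best rank-2 approximation does not exist.
for eps in [1e-1, 1e-3, 1e-5]:
    T_eps = (outer3(a + eps*b, a + eps*b, a + eps*b)
             - outer3(a, a, a)) / eps
    print(eps, np.linalg.norm(T - T_eps))
```

The approximation error behaves like √3·eps, illustrating why extra conditions (such as the angular constraint on loading-matrix columns used in the paper) are needed to restore well-posedness.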
Similar resources
On the Tensor SVD and Optimal Low Rank Orthogonal Approximations of Tensors
Abstract. It is known that a high order tensor does not necessarily have an optimal low rank approximation, and that a tensor might not be orthogonally decomposable (i.e., admit a tensor SVD). We provide several sufficient conditions which lead to the failure of the tensor SVD, and characterize the existence of the tensor SVD with respect to the Higher Order SVD (HOSVD) of a tensor. In face of ...
Tensor Low Multilinear Rank Approximation by Structured Matrix Low-Rank Approximation
We present a new connection between higher-order tensors and affinely structured matrices, in the context of low-rank approximation. In particular, we show that the tensor low multilinear rank approximation problem can be reformulated as a structured matrix low-rank approximation, the latter being an extensively studied and well understood problem. We first consider symmetric tensors. Although t...
New Ranks for Even-Order Tensors and Their Applications in Low-Rank Tensor Optimization
In this paper, we propose three new tensor decompositions for even-order tensors corresponding respectively to the rank-one decompositions of some unfolded matrices. Consequently such new decompositions lead to three new notions of (even-order) tensor ranks, to be called the M-rank, the symmetric M-rank, and the strongly symmetric M-rank in this paper. We discuss the bounds between these new te...
Low-Rank Approximation and Completion of Positive Tensors
Unlike the matrix case, computing low-rank approximations of tensors is NP-hard and numerically ill-posed in general. Even the best rank-1 approximation of a tensor is NP-hard. In this paper, we use convex optimization to develop polynomial-time algorithms for low-rank approximation and completion of positive tensors. Our approach is to use algebraic topology to define a new (numerically well-p...
On the Tensor SVD and the Optimal Low Rank Orthogonal Approximation of Tensors
It is known that a higher order tensor does not necessarily have an optimal low rank approximation, and that a tensor might not be orthogonally decomposable (i.e., admit a tensor SVD). We provide several sufficient conditions which lead to the failure of the tensor SVD, and characterize the existence of the tensor SVD with respect to the Higher Order SVD (HOSVD). In face of these difficulties t...
Dynamical Approximation by Hierarchical Tucker and Tensor-Train Tensors
We extend results on the dynamical low-rank approximation for the treatment of time-dependent matrices and tensors (Koch & Lubich, 2007 and 2010) to the recently proposed Hierarchical Tucker tensor format (HT, Hackbusch & Kühn, 2009) and the Tensor Train format (TT, Oseledets, 2011), which are closely related to tensor decomposition methods used in quantum physics and chemistry. In this dynamic...
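Several of the entries above concern the best rank-1 approximation, which is NP-hard in general. The standard local heuristic is alternating least squares, also known as higher-order power iteration; a minimal sketch for a third-order tensor (random data assumed for illustration; this converges to a locally optimal rank-1 term, not necessarily the global best) is:

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((4, 4, 4))

# Random unit starting vectors for the three modes.
u = rng.standard_normal(4); u /= np.linalg.norm(u)
v = rng.standard_normal(4); v /= np.linalg.norm(4 * [1.0] @ np.eye(4) * 0 + v)
w = rng.standard_normal(4); w /= np.linalg.norm(w)

# Alternating updates: each step contracts T against the other two
# factors and renormalizes, monotonically increasing |<T, u⊗v⊗w>|.
for _ in range(200):
    u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
    v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
    w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)

# Optimal scaling for the fixed unit vectors, and the rank-1 approximant.
sigma = np.einsum('ijk,i,j,k->', T, u, v, w)
approx = sigma * np.einsum('i,j,k->ijk', u, v, w)
print(np.linalg.norm(T - approx), np.linalg.norm(T))
```

Because the residual satisfies ‖T − σ u⊗v⊗w‖² = ‖T‖² − σ², the approximation error is always strictly below ‖T‖ whenever σ ≠ 0, but different starting vectors can reach different local optima.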
Journal:
Volume / Issue:
Pages: -
Publication date: 2011